101.
This article presents a new semidistance for functional observations that generalizes the Mahalanobis distance for multivariate datasets. The main characteristics of the functional Mahalanobis semidistance are shown. To illustrate the applicability of this measure of proximity between functional observations, new versions of several well-known functional classification procedures are developed using the functional Mahalanobis semidistance. A Monte Carlo study and the analysis of two real examples indicate that the classification methods used in conjunction with the functional Mahalanobis semidistance give better results than other well-known functional classification procedures. This article has supplementary material online.
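In practice a semidistance of this kind is often computed on functional principal component (FPC) scores, with each score scaled by its component variance. The numpy sketch below illustrates that idea only; the function name, the grid discretization of the curves, and the truncation level `k` are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def functional_mahalanobis(X, i, j, k=3):
    """Squared Mahalanobis-type semidistance between curves i and j,
    computed on the first k functional principal component scores.
    X: (n_curves, n_gridpoints) array of discretized functional observations."""
    Xc = X - X.mean(axis=0)                 # centre the sample of curves
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T                  # FPC scores of every curve
    lam = (s[:k] ** 2) / (X.shape[0] - 1)   # eigenvalues (score variances)
    d = scores[i] - scores[j]
    return float(np.sum(d ** 2 / lam))
```

Truncating to `k` components is what turns the non-invertible functional covariance problem into a well-defined semidistance.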
102.
Monoclonal antibody-targeted polymeric nanoparticles for cancer therapy – future prospects
Stephen Goodall, Martina L. Jones, Stephen Mahler. Journal of Chemical Technology and Biotechnology, 2015, 90(7): 1169-1176
Although combination therapy for cancer utilising monoclonal antibodies in conjunction with chemotherapeutic drugs has increased five-year survival, significant morbidity and mortality remain associated with systemic delivery of cytotoxic drugs. The advent of living radical polymerisation has produced complex and elegant nanoparticle structures that can be engineered to passively target a drug payload for cancer treatment. This presents a therapeutic modality whereby biodistribution, and consequently systemic toxicity, can be reduced while focusing drug delivery on the tumour site. Nanoparticle delivery can be enhanced by attaching a targeting monoclonal antibody fragment to facilitate tumour cell uptake through endocytosis and so increase therapeutic efficacy. In this way, monoclonal antibodies can be supercharged by carrying a payload consisting of a cocktail of conventional chemotherapeutic drugs and siRNA. This review focuses on polymeric nanoparticles targeted to cancer cells by antibodies, and on methods and technologies for synthesising such antibody-targeted nanoparticles. The review is confined to polymer-based nanoparticles, as these offer some advantages over liposomal nanoparticles and may circumvent some of the pitfalls in nanomedicine. Development of these antibody-based polymeric nanoparticles and future directions for therapy are highlighted. © 2014 Society of Chemical Industry
103.
Estimation of longitudinal models of relationship status between all pairs of individuals (dyads) in social networks is challenging due to the complex inter-dependencies among observations and lengthy computation times. To reduce the computational burden of model estimation, a method is developed that subsamples the “always-null” dyads in which no relationships develop throughout the period of observation. The informative sampling process is accounted for by weighting the likelihood contributions of the observations by the inverses of the sampling probabilities. This weighted-likelihood estimation method is implemented using Bayesian computation and evaluated in terms of its bias, efficiency, and speed of computation under various settings. Comparisons are also made to a full information likelihood-based procedure that is only feasible to compute when limited follow-up observations are available. Calculations are performed on two real social networks of very different sizes. The easily computed weighted-likelihood procedure closely approximates the corresponding estimates for the full network, even when using low sub-sampling fractions. The fast computation times make the weighted-likelihood approach practical and able to be applied to networks of any size.
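The weighting idea can be sketched in a few lines: every dyad with an event is kept, always-null dyads are kept with some probability, and sampled nulls enter the likelihood with weight equal to the inverse of that probability. The simple Bernoulli likelihood, the function names, and the single sampling probability `pi` below are illustrative assumptions; the paper works with longitudinal network models and Bayesian computation.

```python
import numpy as np

def subsample_null_dyads(y, pi, rng):
    """Keep every dyad with an event (y == 1); keep null dyads with
    probability pi. Returns kept indices and inverse-probability weights."""
    keep, w = [], []
    for i, yi in enumerate(y):
        if yi == 1:
            keep.append(i); w.append(1.0)
        elif rng.random() < pi:
            keep.append(i); w.append(1.0 / pi)
    return np.array(keep), np.array(w)

def weighted_loglik(beta, x, y, w):
    """Inverse-probability-weighted Bernoulli log-likelihood:
    each observation's contribution is multiplied by w[i] = 1/pi_i."""
    p = 1.0 / (1.0 + np.exp(-(x @ beta)))
    return float(np.sum(w * (y * np.log(p) + (1 - y) * np.log(1 - p))))
```

Maximising this weighted likelihood approximately corrects for the informative subsampling of the null dyads.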
104.
Mohamed Abdellatif. Color Research and Application, 2015, 40(6): 564-576
The spectral overlap of color-sampling filters increases errors when a diagonal matrix transform is used for color correction, and it reduces color distinction. Spectral sharpening is a transformation of colors that was introduced to reduce color-constancy errors when colors are collected through spectrally overlapping filters. Earlier color-constancy methods improved color precision under illuminant changes but overlooked color distinction. In this article, we introduce a new spectral sharpening technique, based on real physical constraints, that offers a good compromise between color precision and distinction. The spectral overlap is measured by observing a gray reference chart with a set of real, spectrally disjoint filters selected by the user. The new method makes it possible to sharpen colors obtained by a sensor without knowing the camera response functions. Experiments with real images showed that colors sharpened by the new method achieve good levels of both color precision and distinction. The color-constancy performance is compared with the data-based sharpening method in terms of both precision and distinction. © 2014 Wiley Periodicals, Inc. Col Res Appl, 40, 564–576, 2015
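The core operation, estimating a linear transform that maps overlapping-filter responses toward the disjoint-filter responses observed on the gray chart, can be sketched as a least-squares fit. The function name and the data layout are assumptions for illustration; the paper's method additionally imposes physical constraints.

```python
import numpy as np

def sharpening_matrix(R_overlap, R_disjoint):
    """Least-squares 3x3 transform M such that M @ r_overlap ~= r_disjoint,
    fitted from paired observations of the same gray-reference patches.
    R_overlap, R_disjoint: (n_patches, 3) camera responses."""
    M, *_ = np.linalg.lstsq(R_overlap, R_disjoint, rcond=None)
    return M.T  # so that sharpened = M @ rgb
```

Fitting on a neutral (gray) chart is what allows the transform to be estimated without knowing the camera response functions.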
105.
106.
Cristian Podesta, Natalie Coleman, Amir Esmalian, Faxi Yuan, Ali Mostafavi. Journal of the Royal Society Interface, 2021, 18(177)
This research establishes a methodological framework for quantifying community resilience based on fluctuations in a population's activity during a natural disaster. Visits to points-of-interest (POIs) over time serve as a proxy for activities to capture the combined effects of perturbations in lifestyles, the built environment and the status of business. This study used digital trace data related to unique visits to POIs in the Houston metropolitan area during Hurricane Harvey in 2017. Resilience metrics in the form of systemic impact, duration of impact, and general resilience (GR) values were examined for the region along with their spatial distributions. The results show that certain categories, such as religious organizations and building material and supplies dealers, had better resilience metrics—low systemic impact, short duration of impact, and high GR. Other categories such as medical facilities and entertainment had worse resilience metrics—high systemic impact, long duration of impact and low GR. Spatial analyses revealed that areas in the community with lower levels of resilience metrics also experienced extensive flooding. This insight demonstrates the validity of the approach proposed in this study for quantifying and analysing data for community resilience patterns using digital trace/location-intelligence data related to population activities. While this study focused on the Houston metropolitan area and only analysed one natural hazard, the same approach could be applied to other communities and disaster contexts. Such resilience metrics bring valuable insight into prioritizing resource allocation in the recovery process.
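Metrics of this shape can be sketched from a single POI-visit time series. The exact definitions below (maximum fractional drop, days below baseline, one minus the normalized area lost) are illustrative stand-ins for the paper's formulations, not its actual equations.

```python
import numpy as np

def resilience_metrics(visits, baseline):
    """Illustrative systemic impact, duration of impact, and general
    resilience (GR) from a daily POI-visit series against a pre-event
    baseline visit count."""
    v = np.asarray(visits, dtype=float)
    drop = np.clip(baseline - v, 0.0, None) / baseline  # fractional shortfall
    systemic_impact = float(drop.max())                 # worst single-day drop
    duration = int((v < baseline).sum())                # days below baseline
    gr = 1.0 - float(drop.sum() / len(v))               # 1 - normalized area lost
    return systemic_impact, duration, gr
```

A category with a shallow, short dip (e.g. building-supply dealers in the study) scores low impact, short duration, and high GR under any such definition.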
107.
Chin-wei Huang. International Transactions in Operational Research, 2021, 28(1): 470-492
The purpose of this study is to develop a modification of the model developed by Chen and Zhu in 2004. Calculating stage and overall efficiencies precisely and consistently has become a major challenge for the two-stage DEA model, yet most other models do not calculate the optimality of intermediates. Although the Chen-Zhu model measures the optimality of intermediates, its calculated efficiency scores still have some shortcomings. The modified model, named the hybrid two-stage DEA model, closes the gap between calculating the optimality of intermediates and maintaining consistent overall efficiency scores. In addition to measuring the optimality of intermediates accurately, the model confines efficiency scores to a range from zero to one (a ratio efficiency score). In an empirical evaluation, we use data from 64 medical manufacturing firms to test the performance of the hybrid model and offer recommendations for the industry.
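For background, the classic input-oriented single-stage CCR envelopment LP, which the two-stage literature builds on, can be sketched with `scipy.optimize.linprog`. This is not the hybrid two-stage model itself, which adds constraints on the intermediate measures; function name and data layout are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0 via the envelopment LP:
    minimise theta s.t. X^T lam <= theta * x_j0, Y^T lam >= y_j0, lam >= 0.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                     # variables: [theta, lam]
    A_in = np.c_[-X[j0][:, None], X.T]              # X^T lam - theta*x_j0 <= 0
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]  # -Y^T lam <= -y_j0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[j0]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return float(res.x[0])
```

An efficient DMU scores 1; a dominated DMU scores below 1, which is the "ratio efficiency score" range the hybrid model enforces for its overall scores as well.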
108.
Solder paste printing (SPP) is a critical procedure in a surface-mount technology (SMT) assembly line and one of the major contributors to printed circuit board (PCB) defects. The quality of SPP is influenced by multiple factors, such as squeegee speed and pressure, stencil separation speed, cleaning frequency, and cleaning profile. During printing, the printer environment varies dynamically because of physical changes in the solder paste, which can alter the relationships between the printing results and the influential factors. To reduce printing defects, it is critical to understand these dynamic relationships. This research focuses on determining printing performance during printing by implementing a wavelet filtering-based temporal recurrent neural network. To reduce noise in the solder paste inspection (SPI) data, a three-dimensional dual-tree complex wavelet transformation is applied for low-pass noise filtering and signal reconstruction. A recurrent neural network is then used to model the performance prediction with low noise interference. Both the printing sequence and the process setting information are considered in the proposed recurrent network model. The proposed approach is validated on a practical dataset and compared with other commonly used data mining approaches. The results show that the proposed wavelet-based multi-dimensional temporal recurrent neural network can effectively predict printing process performance and is a promising approach for reducing defects and controlling cleaning frequency. The proposed model is expected to advance current research on the application of smart manufacturing in surface-mount technology.
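The denoise-then-reconstruct pattern can be illustrated with a far simpler transform than the paper's three-dimensional dual-tree complex wavelet: a one-dimensional Haar decomposition that keeps only the approximation coefficients, i.e. a low-pass reconstruction. This is a deliberately minimal stand-in, not the paper's filter.

```python
import numpy as np

def haar_lowpass(x, levels=2):
    """Low-pass reconstruction of a 1-D signal: repeatedly average adjacent
    pairs (Haar approximation), discarding detail coefficients, then
    upsample back to roughly the original length by repetition."""
    x = np.asarray(x, dtype=float)
    for _ in range(levels):
        if len(x) % 2:                    # pad odd-length signals
            x = np.append(x, x[-1])
        x = (x[0::2] + x[1::2]) / 2.0     # Haar approximation coefficients
    return np.repeat(x, 2 ** levels)
```

The filtered series (here a stand-in for denoised SPI measurements) is what would be fed to the recurrent network alongside the process settings.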
109.
Today’s information technologies involve increasingly intelligent systems, which come at the cost of increasingly complex equipment. Modern monitoring systems collect multi-measuring-point and long-term data which make equipment health prediction a “big data” problem. It is difficult to extract information from such condition monitoring data to accurately estimate or predict health statuses. Deep learning is a powerful tool for big data processing that is widely utilized in image and speech recognition applications, and can also provide effective predictions in industrial processes. This paper proposes the Long Short-term Memory Integrating Principal Component Analysis based on Human Experience (HEPCA-LSTM), which uses operational time-series data for equipment health prognostics. Principal component analysis based on human experience is first conducted to extract condition parameters from the condition monitoring system. The long short-term memory (LSTM) framework is then constructed to predict the target status. Finally, a dynamic update of the prediction model with incoming data is performed at a certain interval to prevent any model misalignment caused by the drifting of relevant variables. The proposed model is validated on a practical case and found to outperform other prediction methods. It utilizes a powerful deep learning analysis method, the LSTM, to fully process big condition monitoring series data; it effectively extracts the features involved with human experience and takes dynamic updates into consideration.
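The PCA feature-extraction stage can be sketched with a plain SVD. The human-experience screening of variables is assumed to have happened upstream, and the function name, shapes, and truncation level are illustrative, not the paper's implementation.

```python
import numpy as np

def pca_condition_features(X, k=2):
    """Project condition-monitoring measurements onto the top-k principal
    components; the resulting score series would be the input sequence for
    the downstream LSTM predictor. X: (n_samples, n_sensors)."""
    Xc = X - X.mean(axis=0)                       # centre each sensor channel
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                          # (n_samples, k) scores
```

Re-running this projection on each new data window is one simple way to realize the periodic model update the paper describes, since the loadings track drifting variables.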
110.
Shaojiang Zhong. Computers, Materials & Continua, 2019, 60(2): 465-479
Based on the classic three-dimensional Chua circuit, a nonlinear circuit containing two flux-controlled memristors is designed. Because the characteristic equations of the two magnetron memristors are designed differently, their positions form a symmetrical structure with respect to the capacitor. The existence of chaotic properties is proved by analyzing the stability of the system through Lyapunov exponents, equilibrium points, eigenvalues, Poincaré maps, power spectra, and bifurcation diagrams. Theoretical analysis and numerical calculation show that this heterogeneous memristive model is a hyperchaotic five-dimensional nonlinear dynamical system with strong chaotic behavior. The memristive system is then applied to digital image and speech signal processing. Analysis of the key space, the sensitivity of key parameters, and the statistical characteristics of the encryption scheme implies that this model can be widely applied in multimedia information security.
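Systems of this kind are studied by numerical integration. The sketch below pairs a classic fourth-order Runge-Kutta integrator with an illustrative four-dimensional flux-controlled memristive Chua-like system; the memductance W(phi) = m0 + 3*m1*phi**2, the parameter values, and the reduced dimension are all assumptions for demonstration, not the paper's five-dimensional two-memristor model.

```python
import numpy as np

def rk4(f, state, dt, steps):
    """Classic fourth-order Runge-Kutta integration of s' = f(s)."""
    traj = [np.asarray(state, dtype=float)]
    for _ in range(steps):
        s = traj[-1]
        k1 = f(s)
        k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2)
        k4 = f(s + dt * k3)
        traj.append(s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.array(traj)

def memristive_chua(s, a=9.0, b=14.0, m0=-0.5, m1=0.5):
    """Illustrative Chua-like oscillator with one flux-controlled memristor:
    state (x, y, z, phi), memductance W(phi) = m0 + 3*m1*phi**2."""
    x, y, z, phi = s
    W = m0 + 3.0 * m1 * phi ** 2
    return np.array([a * (y - W * x), x - y + z, -b * y, x])
```

With a trajectory in hand, quantities such as Lyapunov exponents, Poincaré sections, and bifurcation diagrams are computed from the sampled states.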